
    Realistic atomistic structure of amorphous silicon from machine-learning-driven molecular dynamics

    Amorphous silicon (a-Si) is a widely studied noncrystalline material, and yet the subtle details of its atomistic structure are still unclear. Here, we show that accurate structural models of a-Si can be obtained using a machine-learning-based interatomic potential. Our best a-Si network is obtained by simulated cooling from the melt at a rate of 10¹¹ K/s (that is, on the 10 ns time scale), contains less than 2% defects, and agrees with experiments regarding excess energies, diffraction data, and ²⁹Si NMR chemical shifts. We show that this level of quality is impossible to achieve with faster quench simulations. We then generate a 4096-atom system that correctly reproduces the magnitude of the first sharp diffraction peak (FSDP) in the structure factor, achieving the closest agreement with experiments to date. Our study demonstrates the broader impact of machine-learning potentials for elucidating structures and properties of technologically important amorphous materials.
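
    The melt-quench protocol described above can be sketched in miniature. The toy below uses a Lennard-Jones cluster as a stand-in for the machine-learning potential and per-step velocity rescaling as a crude stand-in for a proper thermostat; every parameter here is illustrative, not taken from the paper.

```python
import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    """Lennard-Jones energy and forces for a small cluster (no PBC).
    A soft-core floor on the squared distance keeps this toy stable."""
    f = np.zeros_like(pos)
    e = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = pos[i] - pos[j]
            d2 = max(r @ r, 0.5)                     # soft-core floor
            inv6 = (sigma**2 / d2) ** 3
            e += 4 * eps * (inv6**2 - inv6)
            fij = 24 * eps * (2 * inv6**2 - inv6) / d2 * r
            f[i] += fij
            f[j] -= fij
    return e, f

def melt_quench(pos, t_hot=0.5, t_cold=0.01, steps=1500, dt=0.002, mass=1.0):
    """Velocity-Verlet MD with a linear temperature ramp imposed by
    velocity rescaling (a crude stand-in for a thermostat)."""
    rng = np.random.default_rng(0)
    vel = rng.normal(scale=np.sqrt(t_hot / mass), size=pos.shape)
    _, f = lj_forces(pos)
    for step in range(steps):
        t_target = t_hot + (t_cold - t_hot) * step / steps   # linear quench
        pos = pos + vel * dt + 0.5 * f / mass * dt**2
        _, f_new = lj_forces(pos)
        vel = vel + 0.5 * (f + f_new) / mass * dt
        f = f_new
        t_now = mass * (vel**2).mean()
        vel *= np.sqrt(t_target / max(t_now, 1e-12))
    return pos, vel

# 8-atom cluster started near a simple-cubic arrangement
grid = np.array([[i, j, k] for i in (0, 1) for j in (0, 1) for k in (0, 1)], float)
pos_final, vel_final = melt_quench(1.12 * grid)
```

    The paper's point about quench rate translates here to the `steps`/`dt` budget: a slower ramp gives the network more time to anneal out defects.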

    On the calculation of the bandgap of periodic solids with MGGA functionals using the total energy

    During the last few years, it has become more and more clear that functionals of the meta generalized gradient approximation (MGGA) are more accurate than GGA functionals for the geometry and energetics of electronic systems. However, MGGA functionals are also potentially more interesting for the electronic structure, in particular, when the potential is nonmultiplicative (i.e., when MGGAs are implemented in the generalized Kohn-Sham framework), which may help to get more accurate bandgaps. Here, we show that the calculation of the bandgap of solids with MGGA functionals can also be done very accurately in a non-self-consistent manner. This scheme uses only the total energy and can, therefore, be very useful when the self-consistent implementation of a particular MGGA functional is not available. Since self-consistent MGGA calculations may be difficult to converge, the non-self-consistent scheme may also help to speed up the calculations. Furthermore, it can be applied to any other types of functionals, for which the implementation of the corresponding potential is not trivial.
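
    The generic total-energy route to a gap, shown below, is the difference of ionization potential and electron affinity computed from three total energies; the paper's scheme for MGGA functionals is more specific than this sketch, and the numbers used here are purely illustrative.

```python
def fundamental_gap(e_nm1, e_n, e_np1):
    """Fundamental gap from total energies of the (N-1)-, N- and
    (N+1)-electron systems:
      E_g = I - A = [E(N-1) - E(N)] - [E(N) - E(N+1)]
                  = E(N+1) + E(N-1) - 2 E(N)"""
    ionization_energy = e_nm1 - e_n
    electron_affinity = e_n - e_np1
    return ionization_energy - electron_affinity

# illustrative total energies (eV), not from any real calculation
gap = fundamental_gap(e_nm1=-9.0, e_n=-10.0, e_np1=-10.2)   # 0.8 eV
```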

    Atomic-scale representation and statistical learning of tensorial properties

    This chapter discusses the importance of incorporating three-dimensional symmetries in the context of statistical learning models geared towards the interpolation of the tensorial properties of atomic-scale structures. We focus on Gaussian process regression, and in particular on the construction of structural representations, and the associated kernel functions, that are endowed with the geometric covariance properties compatible with those of the learning targets. We summarize the general formulation of such a symmetry-adapted Gaussian process regression model, and how it can be implemented based on a scheme that generalizes the popular smooth overlap of atomic positions representation. We give examples of the performance of this framework when learning the polarizability and the ground-state electron density of a molecule.
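
    The symmetry requirement in the scalar case, where covariance reduces to invariance, can be demonstrated with a minimal kernel: sorted interatomic distances stand in for a SOAP-type representation (this illustrates only the symmetry property, not the actual SOAP or symmetry-adapted construction).

```python
import numpy as np

def invariant_descriptor(pos):
    """Sorted pairwise distances: invariant under rotations, reflections,
    translations and (because of the sort) permutations of identical atoms."""
    n = len(pos)
    d = [np.linalg.norm(pos[i] - pos[j])
         for i in range(n) for j in range(i + 1, n)]
    return np.sort(d)

def invariant_kernel(a, b, sigma=1.0):
    """Gaussian kernel on the invariant descriptors of two structures."""
    diff = invariant_descriptor(a) - invariant_descriptor(b)
    return np.exp(-(diff @ diff) / (2 * sigma**2))

def random_orthogonal(rng):
    """Random orthogonal matrix via QR of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.diag(r))

rng = np.random.default_rng(0)
atoms = rng.normal(size=(4, 3))
R = random_orthogonal(rng)
k_rot = invariant_kernel(atoms, atoms @ R.T)   # rotated copy: kernel stays 1
```

    A tensorial target such as the polarizability cannot use a purely invariant kernel like this one; that is exactly where the covariant kernels discussed in the chapter come in.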

    Gaussian Approximation Potentials: the accuracy of quantum mechanics, without the electrons

    We introduce a class of interatomic potential models that can be automatically generated from data consisting of the energies and forces experienced by atoms, derived from quantum mechanical calculations. The resulting model does not have a fixed functional form and hence is capable of modeling complex potential energy landscapes. It is systematically improvable with more data. We apply the method to bulk carbon, silicon and germanium and test it by calculating properties of the crystals at high temperatures. Using the interatomic potential to generate the long molecular dynamics trajectories required for such calculations saves orders of magnitude in computational cost.
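
    The fitting step underlying such potentials, kernel regression of energies against structural descriptors, can be sketched on a one-dimensional toy problem; a Morse curve stands in for the quantum-mechanical training data, and nothing below is specific to the GAP implementation.

```python
import numpy as np

def rbf(x1, x2, sigma=0.3):
    """Squared-exponential kernel matrix between 1-D descriptors."""
    return np.exp(-(x1[:, None] - x2[None, :])**2 / (2 * sigma**2))

# "training data": energies of a 1-D Morse dimer, standing in for QM data
r_train = np.linspace(0.8, 2.5, 30)
e_train = (1 - np.exp(-3 * (r_train - 1.2)))**2

# kernel ridge fit: alpha = (K + lam*I)^{-1} y; no fixed functional form,
# and the fit improves systematically as training points are added
lam = 1e-8
alpha = np.linalg.solve(rbf(r_train, r_train) + lam * np.eye(len(r_train)),
                        e_train)

# predict energies at unseen separations
r_test = np.linspace(0.9, 2.4, 50)
e_pred = rbf(r_test, r_train) @ alpha
e_true = (1 - np.exp(-3 * (r_test - 1.2)))**2
```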

    Machine Learning Interatomic Potentials as Emerging Tools for Materials Science.

    Atomic-scale modeling and understanding of materials have made remarkable progress, but they are still fundamentally limited by the large computational cost of explicit electronic-structure methods such as density-functional theory. This Progress Report shows how machine learning (ML) is currently enabling a new degree of realism in materials modeling: by "learning" electronic-structure data, ML-based interatomic potentials give access to atomistic simulations that reach similar accuracy levels but are orders of magnitude faster. A brief introduction to the new tools is given, and then, applications to some select problems in materials science are highlighted: phase-change materials for memory devices; nanoparticle catalysts; and carbon-based electrodes for chemical sensing, supercapacitors, and batteries. It is hoped that the present work will inspire the development and wider use of ML-based interatomic potentials in diverse areas of materials research.

    Gaussian Approximation Potentials: theory, software implementation and application examples

    Gaussian Approximation Potentials are a class of Machine Learned Interatomic Potentials routinely used to model materials and molecular systems on the atomic scale. The software implementation provides the means for both fitting models using ab initio data and using the resulting potentials in atomic simulations. Details of the GAP theory, algorithms and software are presented, together with detailed usage examples to help new and existing users. We review some recent developments to the GAP framework, including MPI parallelisation of the fitting code enabling its use on thousands of CPU cores and compression of descriptors to eliminate the poor scaling with the number of different chemical elements.

    Building nonparametric n-body force fields using Gaussian process regression

    Constructing a classical potential suited to simulate a given atomic system is a remarkably difficult task. This chapter presents a framework under which this problem can be tackled, based on the Bayesian construction of nonparametric force fields of a given order using Gaussian process (GP) priors. The formalism of GP regression is first reviewed, particularly in relation to its application in learning local atomic energies and forces. For accurate regression it is fundamental to incorporate prior knowledge into the GP kernel function. To this end, this chapter details how properties of smoothness, invariance and interaction order of a force field can be encoded into corresponding kernel properties. A range of kernels is then proposed, possessing all the required properties and an adjustable parameter n governing the interaction order modelled. The order n best suited to describe a given system can be found automatically within the Bayesian framework by maximisation of the marginal likelihood. The procedure is first tested on a toy model of known interaction and later applied to two real materials described at the DFT level of accuracy. The models automatically selected for the two materials were found to be in agreement with physical intuition. More generally, it was found that lower order (simpler) models should be chosen when the data are not sufficient to resolve more complex interactions. Low-n GPs can be further sped up by orders of magnitude by constructing the corresponding tabulated force field, here named "MFF".
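
    The model-selection principle used here, choosing among candidate kernels by maximising the GP marginal likelihood, can be illustrated in a simpler setting: lengthscale selection in place of interaction-order selection (same mechanism, illustrative data; this is not the chapter's n-body kernel family).

```python
import numpy as np

def log_marginal_likelihood(x, y, sigma, noise=1e-2):
    """GP log marginal likelihood with an RBF kernel of lengthscale sigma:
    log p(y) = -0.5 y^T K^{-1} y - 0.5 log|K| - (n/2) log(2 pi).
    The log-determinant term automatically penalises over-complex models."""
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * sigma**2))
    K += noise**2 * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(x) * np.log(2 * np.pi))

rng = np.random.default_rng(2)
x = np.linspace(0, 4, 40)
y = np.sin(2 * x) + 0.01 * rng.normal(size=40)   # true lengthscale ~0.5

candidates = [0.05, 0.5, 5.0]                    # too wiggly / right / too stiff
best = max(candidates, key=lambda s: log_marginal_likelihood(x, y, s))
```

    The same maximisation over the discrete order parameter n is what lets the chapter's framework pick the simplest interaction model the data can support.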

    Machine-learning of atomic-scale properties based on physical principles

    We briefly summarize the kernel regression approach, as used recently in materials modelling, to fitting functions, particularly potential energy surfaces, and highlight how the linear algebra framework can be used to both predict and train from linear functionals of the potential energy, such as the total energy and atomic forces. We then give a detailed account of the Smooth Overlap of Atomic Positions (SOAP) representation and kernel, showing how it arises from an abstract representation of smooth atomic densities, and how it is related to several popular density-based representations of atomic structure. We also discuss recent generalisations that allow fine control of correlations between different atomic species, prediction and fitting of tensorial properties, and also how to construct structural kernels---applicable to comparing entire molecules or periodic systems---that go beyond an additive combination of local environments.
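
    Predicting one linear functional of the energy (the force) from a model trained on another (the energy itself) follows directly from the linear-algebra framework: differentiate the kernel, not the data. A one-dimensional sketch with a toy potential (not SOAP; all names and parameters are illustrative):

```python
import numpy as np

sigma = 0.5

def k(a, b):
    """RBF kernel matrix between two sets of 1-D configurations."""
    return np.exp(-(a[:, None] - b[None, :])**2 / (2 * sigma**2))

x_train = np.linspace(0, 2 * np.pi, 40)
e_train = np.sin(x_train)                    # toy potential energy surface

# fit on energies only
alpha = np.linalg.solve(k(x_train, x_train) + 1e-8 * np.eye(40), e_train)

# forces at new points come from the analytic kernel derivative:
# d/da exp(-(a-b)^2 / 2 sigma^2) = -(a-b)/sigma^2 * k(a, b)
x_new = np.array([1.0, 2.0, 3.0])
dk = -(x_new[:, None] - x_train[None, :]) / sigma**2 * k(x_new, x_train)
f_pred = -dk @ alpha                         # F = -dE/dx
f_true = -np.cos(x_new)
```

    The same structure runs in reverse: force observations enter the fit through kernel derivatives, so energies and forces can be trained on jointly.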

    Nested sampling for materials: the case of hard spheres

    The recently introduced nested sampling algorithm allows the direct and efficient calculation of the partition function of atomistic systems. We demonstrate its applicability to condensed phase systems with periodic boundary conditions by studying the three-dimensional hard sphere model. Having obtained the partition function, we show how easy it is to calculate the compressibility and the free energy as functions of the packing fraction and local order, verifying that the transition to crystallinity has a very small barrier, and that the entropic contribution of jammed states to the free energy is negligible for packing fractions above the phase transition. We quantify the previously proposed schematic phase diagram and estimate the extent of the region of jammed states. We find that within our samples, the maximally random jammed configuration is surprisingly disordered.
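
    The basic nested-sampling estimate of a partition function can be sketched on a one-dimensional harmonic toy, where the likelihood-constrained resampling step can be done exactly (hard spheres, of course, require a real constrained sampler; every parameter below is illustrative).

```python
import numpy as np

def nested_sampling_partition(beta=1.0, half_width=5.0,
                              n_live=100, n_iter=800, seed=3):
    """Toy nested-sampling estimate of Z = integral of exp(-beta*E(x)) dx
    for E(x) = x^2/2, uniform prior on [-half_width, half_width].
    In 1-D the constrained region E < E* is just |x| < sqrt(2 E*), so the
    constrained prior can be sampled exactly instead of by random walk."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(-half_width, half_width, n_live)
    energy = 0.5 * live**2
    z, x_prev = 0.0, 1.0                 # z accumulated in prior-mass units
    for i in range(1, n_iter + 1):
        worst = np.argmax(energy)
        e_star = energy[worst]
        x_i = np.exp(-i / n_live)        # mean shrinkage of enclosed mass
        z += (x_prev - x_i) * np.exp(-beta * e_star)
        x_prev = x_i
        a = min(np.sqrt(2.0 * e_star), half_width)
        live[worst] = rng.uniform(-a, a) # exact draw from constrained prior
        energy[worst] = 0.5 * live[worst]**2
    z += x_prev * np.exp(-beta * energy).mean()   # leftover live points
    return 2 * half_width * z            # undo prior normalization

z_est = nested_sampling_partition()      # exact answer is sqrt(2*pi) ~ 2.507
```

    A single run delivers Z at all temperatures in principle, since the sorted energy levels and their prior-mass weights can be reweighted with any beta; that is the property the paper exploits to map out the hard-sphere phase behaviour.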